58 - Deep Learning - Plain Version 2020 [ID:21192]

Welcome back to Deep Learning. So today we're going to start talking about ideas

that are called self-supervised learning.

Somehow we want to obtain labels by self-supervision, and we will look into what this term

actually means and what the core ideas are in the next couple of videos.

So this is Part 3 of Weakly and Self-Supervised

Learning, and today we actually start talking about self-supervised learning.

There are a couple of videos around on self-supervised learning, and you

can essentially split them into two parts: one is how to get the

labels, the self-supervised labels, and the other part is that you work on

the losses in order to embed those labels.

Also, it is a pretty common saying that the

revolution will not be supervised.

This is very clearly visible in the following statement

by Yann LeCun:

Most of human and animal learning is unsupervised learning.

If intelligence was a cake,

unsupervised learning would be the cake.

Supervised learning would be the icing on the cake

and reinforcement learning would be the cherry on the cake.

And of course this is substantiated by observations

and models from psychology of how humans and animals learn.

So the idea of self-supervision is that you try to use information you already have about your problem to

come up with some surrogate label that allows you to run a training process.

The key ideas here on this slide by Yann LeCun can be summarized as follows:

you try to predict the future from the past, you can also predict the future from the

recent past, or you predict the past from the present, or the top from the bottom.

Also, an option could be to predict the occluded from the visible.

So you pretend that there is a part

of the input that you don't know and predict that.

And this essentially allows you to come up with the surrogate task.

And with the surrogate task, you can already perform training.

And the nice thing is you don't need

any label at all because you intrinsically

use the structure of the data.
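To make this concrete, here is a minimal sketch (in plain Python, with a hypothetical helper name) of the occlusion idea: we take unlabeled sequences, hide one element at a time, and use the hidden element as the target. The supervised (input, target) pairs come purely from the structure of the data.

```python
def make_occlusion_pairs(sequence, mask_token=None):
    """Turn one unlabeled sequence into supervised (input, target) pairs.

    For each position, we pretend that element is unknown (occluded),
    replace it with mask_token, and use the hidden value as the label.
    """
    pairs = []
    for i, target in enumerate(sequence):
        visible = list(sequence)
        visible[i] = mask_token  # occlude this part of the input
        pairs.append((visible, target))
    return pairs

# Unlabeled data: the surrogate labels are generated automatically.
pairs = make_occlusion_pairs([3, 1, 4, 1])
```

A real model would then be trained to predict the occluded target from the visible part; this sketch only shows how the surrogate labels are obtained for free.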

So essentially self-supervised learning

is an unsupervised learning approach.

But every now and then you need to make clear

that you're doing something new in a domain that has been

researched for many decades.

So you may not want to use the term unsupervised anymore.

And Yann LeCun actually proposed the term self-supervised learning.

And he realized that unsupervised is a loaded and confusing term.

So although the ideas had already been around before the term

self-supervised learning was established, it makes sense to use this

term to concentrate on a particular kind of unsupervised learning.

So you could say it's a subcategory of unsupervised learning.

It uses pretext, surrogate, or pseudo-tasks in a supervised fashion.

And this essentially means you can use all of the supervised learning methods, and you

have labels that are automatically generated that can then be used to compute a training loss.
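As an illustration of such automatically generated labels, here is a hedged sketch in the style of rotation prediction, a well-known pretext task: each unlabeled image yields four training examples whose labels (the number of 90° rotations) are created for free, so any ordinary supervised classifier and loss can be reused. The images and function names here are toy assumptions, not the lecture's actual implementation.

```python
def rotate90(img):
    """Rotate a 2D list (an H x W toy 'image') by 90 degrees clockwise."""
    return [list(row) for row in zip(*img[::-1])]

def rotation_examples(img):
    """Generate (rotated_image, surrogate_label) pairs from one image.

    The surrogate label is simply how many 90-degree rotations were
    applied (0, 1, 2, or 3) -- no human annotation is needed.
    """
    examples = []
    current = img
    for label in range(4):
        examples.append((current, label))
        current = rotate90(current)
    return examples

# One unlabeled 2x2 "image" becomes four labeled training examples.
examples = rotation_examples([[1, 2], [3, 4]])
```

A classifier trained to predict these rotation labels is forced to learn about object orientation and structure, which is exactly the point of a pretext task.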

Part of a video series:

Accessible via: Open Access

Duration: 00:15:55 min

Recording date: 2020-10-12

Uploaded on: 2020-10-12 22:46:19

Language: en-US

Deep Learning - Weakly and Self-Supervised Learning Part 3

In this video, we look into the fundamental concepts of self-supervised learning. In particular, we look at different strategies to create surrogate labels from data automatically.


Further Reading:
A gentle Introduction to Deep Learning
